A potential reduction method for a class of smooth convex programming problems
Authors
Abstract
In this paper we propose a potential reduction method for smooth convex programming. It is assumed that the objective and constraint functions fulfil the so-called Relative Lipschitz Condition, with Lipschitz constant M > 0. The great advantage of this method over the existing path-following methods is that it allows line searches. In our method we do line searches along the Newton direction with respect to a strictly convex potential function if we are far away from the central path. If we are sufficiently close to this path, we update a lower bound for the optimal value. We prove that the number of iterations required by the algorithm to converge to an ε-optimal solution is O((1 + M²)√n |ln ε|) or O((1 + M²) n |ln ε|), depending on the updating scheme for the lower bound.
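To make the line-search step concrete, the following is a minimal sketch (in Python) of a Newton step with backtracking line search for a potential function of the assumed form q·ln(f0(x) − z) − Σ_i ln(−f_i(x)), where z is a fixed valid lower bound on the optimal value. The weight q, the step-size constants, the steepest-descent safeguard, and the test problem are illustrative assumptions; the paper's specific potential function and its lower-bound updating scheme are not reproduced here.

```python
# Sketch of a line search along the Newton direction of an assumed potential
#     phi(x; z) = q*ln(f0(x) - z) - sum_i ln(-f_i(x)),
# for   min f0(x) = 0.5*x'Qx + c'x   s.t.   A x <= b,
# with z a fixed valid lower bound on the optimal value (illustrative only).
import numpy as np

def potential(x, z, Q, c, A, b, q):
    gap = 0.5 * x @ Q @ x + c @ x - z          # f0(x) - z, must stay > 0
    slack = b - A @ x                          # -f_i(x), must stay > 0
    return q * np.log(gap) - np.sum(np.log(slack))

def newton_direction(x, z, Q, c, A, b, q):
    gap = 0.5 * x @ Q @ x + c @ x - z
    slack = b - A @ x
    g0 = Q @ x + c                             # gradient of f0
    grad = q * g0 / gap + A.T @ (1.0 / slack)
    hess = (q / gap) * Q - (q / gap ** 2) * np.outer(g0, g0) \
           + A.T @ np.diag(1.0 / slack ** 2) @ A
    d = np.linalg.solve(hess, -grad)
    if grad @ d >= 0:                          # safeguard if the Hessian model
        d = -grad                              # is not positive definite here
    return d, grad

def line_search(x, d, slope, z, Q, c, A, b, q, beta=0.5, sigma=1e-4):
    """Backtracking (Armijo) line search that keeps the iterate strictly feasible."""
    phi0 = potential(x, z, Q, c, A, b, q)
    t = 1.0
    while t > 1e-12:
        xt = x + t * d
        if (np.all(b - A @ xt > 0)
                and 0.5 * xt @ Q @ xt + c @ xt - z > 0
                and potential(xt, z, Q, c, A, b, q) <= phi0 + sigma * t * slope):
            return xt
        t *= beta
    return x

# Tiny example: min 0.5*||x||^2 s.t. x >= 1 (written as -x <= -1), lower bound z = 0.
Q, c = np.eye(2), np.zeros(2)
A, b = -np.eye(2), -np.ones(2)
x, z, q = np.array([2.0, 3.0]), 0.0, 2 + np.sqrt(2)    # q = n + sqrt(n) (illustrative)
for _ in range(50):
    d, grad = newton_direction(x, z, Q, c, A, b, q)
    x = line_search(x, d, grad @ d, z, Q, c, A, b, q)
print(x)   # approaches the minimizer of phi(.; z); the method would now update z
```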
Similar resources
A Method for Solving Convex Quadratic Programming Problems Based on Differential-algebraic equations
In this paper, a new model based on differential-algebraic equations (DAEs) for solving convex quadratic programming (CQP) problems is proposed. It is proved that the new approach is guaranteed to generate optimal solutions for this class of optimization problems. This paper also shows that the conventional interior point methods for solving CQP problems can be viewed as a special case of the n...
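As a generic illustration of this dynamical-systems viewpoint (not the DAE model of that paper), one can integrate a gradient flow whose algebraic constraint Ax = b is preserved along the trajectory and whose equilibria are the KKT points of an equality-constrained CQP; the data below are made up for the example.

```python
# Gradient flow for  min 0.5 x'Qx + c'x  s.t.  A x = b,  projected onto null(A)
# so that the algebraic constraint A x = b is preserved (generic sketch only).
import numpy as np

def cqp_gradient_flow(Q, c, A, b, dt=0.01, steps=5000):
    x = np.linalg.lstsq(A, b, rcond=None)[0]        # feasible starting point
    P = np.eye(A.shape[1]) - np.linalg.pinv(A) @ A  # projector onto null(A)
    for _ in range(steps):
        x -= dt * (P @ (Q @ x + c))                 # Euler step of dx/dt = -P*grad f
    return x

Q = np.diag([2.0, 4.0, 6.0])
c = np.array([1.0, -2.0, 0.5])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([3.0])
x = cqp_gradient_flow(Q, c, A, b)
print(x, A @ x)     # x approaches a KKT point while A x stays (numerically) equal to b
```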
Augmented Lagrangian method for solving absolute value equation and its application in two-point boundary value problems
One of the most important topics considered in recent years by researchers is the absolute value equation (AVE). The absolute value equation seems to be a useful tool in optimization since it subsumes the linear complementarity problem and thus also linear programming and convex quadratic programming. This paper introduces a new method for solving the absolute value equation. To do this, we transform a...
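For context, the absolute value equation is the system Ax − |x| = b. The sketch below uses a standard generalized Newton iteration for AVE, shown only to make the problem concrete; it is not the augmented Lagrangian method introduced in that paper, and the random test instance is purely illustrative.

```python
# Generalized Newton iteration for the AVE  A x - |x| = b  (generic sketch).
import numpy as np

def generalized_newton_ave(A, b, x0=None, tol=1e-10, max_iter=100):
    x = np.zeros(len(b)) if x0 is None else x0.astype(float).copy()
    for _ in range(max_iter):
        D = np.diag(np.sign(x))                # a generalized Jacobian of |x|
        x = np.linalg.solve(A - D, b)          # linearized (semismooth Newton) step
        if np.linalg.norm(A @ x - np.abs(x) - b) < tol:
            break
    return x

# Random test instance; the large diagonal shift is a heuristic to make the AVE
# uniquely solvable (singular values of A well above 1).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)) + 8 * np.eye(5)
x_true = rng.standard_normal(5)
b = A @ x_true - np.abs(x_true)
x = generalized_newton_ave(A, b)
print(np.linalg.norm(A @ x - np.abs(x) - b))   # residual, expected to be near zero
```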
A Projected Alternating Least square Approach for Computation of Nonnegative Matrix Factorization
Nonnegative matrix factorization (NMF) is a common method in data mining that has been used in different applications as a dimension-reduction, classification, or clustering method. Methods in the alternating least squares (ALS) approach are usually used to solve this non-convex minimization problem. At each step of ALS algorithms two convex least squares problems should be solved, which causes high com...
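A minimal projected-ALS sketch for NMF is shown below: each iteration solves the two convex least squares subproblems in closed form and clips the factors to the nonnegative orthant. This is the generic scheme the abstract alludes to, with hypothetical parameters, and not the specific projected variant analysed in that paper.

```python
# Projected alternating least squares for V ~= W H with W, H >= 0 (generic sketch).
import numpy as np

def nmf_projected_als(V, rank, n_iter=200, eps=1e-12, seed=0):
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Least squares for H with W fixed, then clip to enforce nonnegativity.
        H = np.maximum(np.linalg.lstsq(W, V, rcond=None)[0], eps)
        # Least squares for W with H fixed (transposed system), then clip.
        W = np.maximum(np.linalg.lstsq(H.T, V.T, rcond=None)[0].T, eps)
    return W, H

V = np.abs(np.random.default_rng(1).standard_normal((20, 15)))
W, H = nmf_projected_als(V, rank=4)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))   # relative reconstruction error
```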
Convex Generalized Semi-Infinite Programming Problems with Constraint Sets: Necessary Conditions
We consider generalized semi-infinite programming problems in which the index set of the inequality constraints depends on the decision vector and all emerging functions are assumed to be convex. Considering a lower level constraint qualification, we derive a formula for estimating the subdifferential of the value function. Finally, we establish the Fritz-John necessary optimality con...
A Recurrent Neural Network for Solving Strictly Convex Quadratic Programming Problems
In this paper we present an improved neural network to solve strictly convex quadratic programming (QP) problems. The proposed model is derived from a piecewise equation corresponding to the optimality conditions of the convex QP problem and has lower structural complexity with respect to the other existing neural network models for solving such problems. On the theoretical side, stability and global converge...
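To illustrate the "neural network as a dynamical system" idea, the sketch below integrates a classical projection-type network for a box-constrained strictly convex QP; its equilibria satisfy the projection form of the optimality conditions. The model, step sizes, and test problem are illustrative assumptions and are not the improved network proposed in that paper.

```python
# Classical projection-type recurrent network for  min 0.5 x'Qx + c'x  s.t.  l <= x <= u,
# integrated by forward Euler (generic sketch).
import numpy as np

def projection_network_qp(Q, c, l, u, x0, alpha=0.1, dt=0.05, steps=2000):
    x = x0.astype(float).copy()
    for _ in range(steps):
        # Equilibria satisfy x = P_[l,u](x - alpha*(Qx + c)), i.e. the
        # optimality condition of the box-constrained QP.
        proj = np.clip(x - alpha * (Q @ x + c), l, u)
        x += dt * (proj - x)                   # Euler step of dx/dt = -x + proj
    return x

Q = np.array([[4.0, 1.0], [1.0, 3.0]])         # symmetric positive definite
c = np.array([-1.0, -2.0])
l, u = np.zeros(2), np.ones(2)
print(projection_network_qp(Q, c, l, u, x0=np.array([0.5, 0.5])))
```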